Derivative-Free Optimization via Classification

Authors

  • Yang Yu
  • Hong Qian
  • Yi-Qi Hu
Abstract

Many randomized heuristic derivative-free optimization methods share a framework that iteratively learns a model of promising search areas and samples solutions from that model. This paper studies a particular setting of such a framework, where the model is implemented by a classification model that discriminates good solutions from bad ones. This setting allows a general theoretical characterization, in which factors critical to the optimization are identified. We also prove that optimization problems with Local Lipschitz continuity can be solved in polynomial time by proper configurations of this framework. Following these critical factors, we propose the randomized coordinate shrinking classification algorithm to learn the model, forming the RACOS algorithm, for optimization in continuous and discrete domains. Experiments on test functions as well as on machine learning tasks, including spectral clustering and classification with the Ramp loss, demonstrate the effectiveness of RACOS.
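To make the learn-and-sample framework concrete, below is a minimal Python sketch of a classification-based optimization loop. The function name, the parameters (`sample_size`, `positive_size`, `exploit_prob`), and the simplified box-shrinking rule are illustrative assumptions standing in for the paper's randomized coordinate shrinking classifier; this is not the authors' implementation of RACOS.

```python
import numpy as np

def classification_based_minimize(f, lower, upper, budget=2000,
                                  sample_size=20, positive_size=2,
                                  exploit_prob=0.9, rng=None):
    """Sketch of a classification-based derivative-free loop.

    Each iteration splits the population into positive (good) and
    negative (bad) solutions, shrinks an axis-parallel box around a
    positive solution along random coordinates until the negatives are
    excluded (a simplified stand-in for randomized coordinate
    shrinking), and samples new candidates mostly from that box.
    """
    rng = rng or np.random.default_rng(0)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = len(lower)

    # initial population drawn uniformly from the search box
    pop = rng.uniform(lower, upper, size=(sample_size, dim))
    vals = np.array([f(x) for x in pop])
    evals = sample_size

    while evals < budget:
        order = np.argsort(vals)
        positives = pop[order[:positive_size]]
        negatives = pop[order[positive_size:]]

        # learn a box around one positive solution by shrinking random coordinates
        anchor = positives[rng.integers(positive_size)]
        box_lo, box_hi = lower.copy(), upper.copy()
        for x in negatives:
            k = rng.integers(dim)  # coordinate chosen at random for shrinking
            r = rng.uniform(min(anchor[k], x[k]), max(anchor[k], x[k]))
            if x[k] > anchor[k]:
                box_hi[k] = min(box_hi[k], r)
            else:
                box_lo[k] = max(box_lo[k], r)

        # sample mostly inside the learned box, occasionally globally
        if rng.random() < exploit_prob:
            cand = rng.uniform(box_lo, box_hi)
        else:
            cand = rng.uniform(lower, upper)
        cand_val = f(cand)
        evals += 1

        # keep the candidate if it improves on the current worst solution
        worst = np.argmax(vals)
        if cand_val < vals[worst]:
            pop[worst], vals[worst] = cand, cand_val

    best = np.argmin(vals)
    return pop[best], vals[best]


if __name__ == "__main__":
    # toy usage: minimize the sphere function on [-1, 1]^10
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, y_best = classification_based_minimize(sphere, [-1.0] * 10, [1.0] * 10)
    print(x_best, y_best)
```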

Similar articles

Sequential Classification-Based Optimization for Direct Policy Search

Direct policy search often yields high-quality policies in complex reinforcement learning problems; it employs optimization algorithms to search the policy parameters so as to maximize the total reward. Classification-based optimization is a recently developed framework for derivative-free optimization, which has been shown to be effective and efficient for non-convex optimization...


A Derivative-Free Optimization Method for Solving Classification Problem

Problem statement: The aim of data classification is to establish rules for classifying observations, assuming we have a database that includes at least two classes, with a training set for each class. Such problems occur in a wide range of human activity. One of the most promising approaches to data classification is based on methods of mathematical optimization. Approac...


Derivative-Free Methods for Mixed-Integer Constrained Optimization Problems

Methods that do not use any derivative information are becoming popular among researchers, since they make it possible to solve many real-world engineering problems. Such problems are frequently characterized by the presence of discrete variables, which can further complicate the optimization process. In this paper, we propose derivative-free algorithms for solving continuously differentiable Mixed Integ...


Derivative-Free Optimization Via Proximal Point Methods

Derivative-Free Optimization (DFO) examines the challenge of minimizing (or maximizing) a function without explicit use of derivative information. Many standard techniques in DFO are based on using model functions to approximate the objective function, and then applying classic optimization methods on the model function. For example, the details behind adapting steepest descent, conjugate gradi...


Algorithm xxx: APPSPACK 4.0: Parallel Pattern Search for Derivative-Free Optimization

APPSPACK is software for solving unconstrained and bound constrained optimization problems. It implements an asynchronous parallel pattern search method that has been specifically designed for problems characterized by expensive function evaluations. Using APPSPACK to solve optimization problems has several advantages: No derivative information is needed; the procedure for evaluating the object...
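For orientation, the following is a minimal serial sketch of the compass (coordinate pattern) search idea underlying this kind of method. The function name, parameters, and sequential polling are assumptions for illustration only; APPSPACK itself evaluates poll points asynchronously in parallel and handles bound constraints, which this sketch omits.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=5000):
    """Serial compass (coordinate pattern) search sketch.

    Polls the 2*d points x +/- step * e_i; moves to the first improving
    point, otherwise halves the step, until the step falls below tol.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    d = len(x)
    while step > tol and evals < max_evals:
        improved = False
        for i in range(d):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                evals += 1
                if ft < fx:          # accept the first improving poll point
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # contract the pattern and poll again
    return x, fx


if __name__ == "__main__":
    # toy usage: minimize the 2-D Rosenbrock function
    rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    print(compass_search(rosen, x0=[-1.2, 1.0]))
```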




Publication date: 2016